Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
Authors
Adrien B. Taylor, Julien M. Hendrickx, François Glineur
Abstract
We provide a framework for computing the exact worst-case performance of any algorithm belonging to a broad class of oracle-based first-order methods for composite convex optimization, including those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps. We simultaneously obtain tight worst-case convergence guarantees and explicit problems on which the algorithm attains this worst case. We achieve this by reducing the computation of the worst case to solving a convex semidefinite program, generalizing previous works on performance estimation by Drori and Teboulle [13] and the authors [38]. We use these developments to obtain tighter analyses of the proximal point algorithm and of several variants of fast proximal gradient, conditional gradient, subgradient and alternating projection methods. In particular, we present a new analytical guarantee for the proximal point algorithm that is twice as tight as the best previously known bound, and we improve the standard convergence guarantee for the conditional gradient method by more than a factor of two. We also show how the optimized gradient method proposed by Kim and Fessler in [20] can be extended by incorporating a projection or a proximal operator, which leads to an algorithm whose worst-case convergence is twice as fast as that of the standard accelerated proximal gradient method [3].
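The reduction mentioned above can be sketched as a performance estimation problem (PEP): the worst case of a method M after N steps is itself the value of an optimization problem over functions and starting points. The notation below is an illustrative sketch in the spirit of the performance-estimation literature, not a verbatim formulation from the paper:

```latex
w(\mathcal{M}, N, R) \;=\; \sup_{F,\, x_0}\; F(x_N) - F(x_\star)
\quad \text{s.t.} \quad
\begin{cases}
F \in \mathcal{F} \ \text{(the given class of composite convex functions)},\\[2pt]
x_\star \in \operatorname{argmin} F, \qquad \|x_0 - x_\star\| \le R,\\[2pt]
x_1, \dots, x_N \ \text{generated by } \mathcal{M} \text{ from } x_0 .
\end{cases}
```

Replacing the functional constraint F ∈ F by finitely many interpolation conditions on the iterates is what turns this infinite-dimensional problem into a finite convex semidefinite program.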
Similar works
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex programs. Finding the worst-case performance of a black-box first-order method is formulated as an optimization problem over a set of smooth (strongly) convex functions and initial conditions. We develop c...
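A key ingredient, sketched here from the smooth strongly convex interpolation literature, is an exact interpolation condition: a finite set of triples {(x_i, g_i, f_i)} can be interpolated by an L-smooth, μ-strongly convex function (with f(x_i) = f_i and g_i a gradient at x_i) precisely when, for every pair of indices i, j,

```latex
f_i \;\ge\; f_j + \langle g_j,\, x_i - x_j \rangle
+ \frac{1}{2L}\,\|g_i - g_j\|^2
+ \frac{\mu}{2\,(1 - \mu/L)}\,
  \Big\| x_i - x_j - \tfrac{1}{L}\,(g_i - g_j) \Big\|^2 .
```

These finitely many convex-representable inequalities are what allow the worst-case computation over an infinite-dimensional function class to be posed over the iterates alone.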
Convex interpolation and performance estimation of first-order methods for convex optimization
The goal of this thesis is to show how to derive in a completely automated way exact and global worst-case guarantees for first-order methods in convex optimization. To this end, we formulate a generic optimization problem looking for the worst-case scenarios. The worst-case computation problems, referred to as performance estimation problems (PEPs), are intrinsically infinite-dimensional optim...
On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant of the gradient descent method, where exact...
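As a quick numerical illustration of the quadratic worst case mentioned above, the sketch below runs steepest descent with exact line search on f(x) = ½ xᵀAx with A = diag(μ, L). On this instance, starting from the classical worst-case point (1/μ, 1/L), every iteration contracts f by exactly ((L − μ)/(L + μ))²; the instance and constants here are illustrative choices, not taken from the paper:

```python
import numpy as np

mu, L = 1.0, 10.0                   # strong convexity / smoothness constants
A = np.diag([mu, L])                # f(x) = 0.5 * x^T A x, minimizer x* = 0
f = lambda x: 0.5 * x @ A @ x

x = np.array([1.0 / mu, 1.0 / L])   # classical worst-case starting point
rate = ((L - mu) / (L + mu)) ** 2   # predicted per-iteration contraction of f

ratios = []
for _ in range(5):
    g = A @ x                       # gradient of the quadratic
    t = (g @ g) / (g @ A @ g)       # exact line-search step length
    x_new = x - t * g
    ratios.append(f(x_new) / f(x))  # observed contraction this iteration
    x = x_new
```

Each entry of `ratios` matches `rate`, illustrating that the worst-case bound is attained with equality on this quadratic.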
On the Evaluation Complexity of Composite Function Minimization with Applications to Nonconvex Nonlinear Programming
We estimate the worst-case complexity of minimizing an unconstrained, nonconvex composite objective with a structured nonsmooth term by means of some first-order methods. We find that it is unaffected by the nonsmoothness of the objective in that a first-order trust-region or quadratic regularization method applied to it takes at most O(ε^{-2}) function-evaluations to reduce the size of a first-or...
Fast First-Order Methods for Composite Convex Optimization with Backtracking
We propose new versions of accelerated first order methods for convex composite optimization, where the prox parameter is allowed to increase from one iteration to the next. In particular we show that a full backtracking strategy can be used within the FISTA [1] and FALM algorithms [7] while preserving their worst-case iteration complexities of O(√(L(f)/ε)). In the original versions of FISTA an...
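The backtracking test such methods rely on can be sketched as follows. This is a minimal, non-accelerated proximal gradient variant with a monotonically increasing Lipschitz estimate; the function names and the tiny LASSO instance are illustrative assumptions, not from the paper:

```python
import numpy as np

def prox_l1(x, t):
    """Soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_grad_backtracking(f, grad_f, prox, x0, L0=1.0, eta=2.0, iters=200):
    """Proximal gradient with a backtracking estimate Lk of the
    Lipschitz constant of grad_f (increased when the test fails)."""
    x, Lk = x0.astype(float).copy(), L0
    for _ in range(iters):
        g = grad_f(x)
        while True:
            y = prox(x - g / Lk, 1.0 / Lk)
            d = y - x
            # accept Lk once the quadratic upper bound on f holds at y
            if f(y) <= f(x) + g @ d + 0.5 * Lk * (d @ d):
                break
            Lk *= eta               # estimate was too small: increase it
        x = y
    return x

# Illustrative instance: minimize 0.5*||A x - b||^2 + lam*||x||_1
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 2.0])
lam = 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
x = prox_grad_backtracking(f, grad_f,
                           lambda z, t: prox_l1(z, lam * t),
                           np.zeros(2))
```

Because this instance is separable, the minimizer can be checked coordinate-wise by soft-thresholding, which makes the sketch easy to validate.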
Journal: SIAM Journal on Optimization
Volume 27
Published 2017